On Unified Generalizations of Relative Jensen–Shannon and Arithmetic–Geometric Divergence Measures, and Their Properties
Authors: Pranesh Kumar and Inder Jeet Taneja
Abstract
In this paper we consider one-parameter generalizations of several non-symmetric divergence measures, namely the Kullback-Leibler relative information, the χ²-divergence, the relative J-divergence, the relative Jensen-Shannon divergence, and the relative Arithmetic-Geometric divergence. All of the generalizations considered can be written as particular cases of Csiszár's f-divergence. By placing suitable conditions on the probability distributions, we develop bounds on these measures and on their parametric generalizations.
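For reference, for probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n), the non-symmetric measures named above are usually written as follows; the relative Jensen-Shannon and relative Arithmetic-Geometric forms follow Taneja's customary notation and are a sketch of the base (non-parametric) definitions, not the paper's exact parametric statements:

\begin{align*}
K(P\|Q) &= \sum_{i=1}^{n} p_i \ln\frac{p_i}{q_i} && \text{(Kullback--Leibler relative information)}\\
\chi^{2}(P\|Q) &= \sum_{i=1}^{n} \frac{(p_i - q_i)^{2}}{q_i} && \text{($\chi^{2}$-divergence)}\\
F(P\|Q) &= \sum_{i=1}^{n} p_i \ln\frac{2p_i}{p_i + q_i} && \text{(relative Jensen--Shannon divergence)}\\
G(P\|Q) &= \sum_{i=1}^{n} \frac{p_i + q_i}{2}\,\ln\frac{p_i + q_i}{2p_i} && \text{(relative Arithmetic--Geometric divergence)}\\
C_f(P\|Q) &= \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right) && \text{(Csisz\'ar's $f$-divergence)}
\end{align*}

Each of the first four is recovered from C_f by a suitable convex f with f(1) = 0, which is what makes the f-divergence framework the natural umbrella for the bounds developed in the paper.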
Similar Articles
Generalized Symmetric Divergence Measures and Metric Spaces
Recently, Taneja [7] studied two one-parameter generalizations of the J-divergence, the Jensen-Shannon divergence and the Arithmetic-Geometric divergence. These two generalizations contain, in particular, measures such as the Hellinger discrimination, the symmetric chi-square divergence, and the triangular discrimination. These measures are well known in the literature of statistics and information theory. In thi...
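To make the particular measures named in this snippet concrete, here is a minimal Python sketch computing them for two discrete distributions; the function names (hellinger, triangular, sym_chi2) are ours, chosen for illustration, and the formulas are the standard ones from the literature:

import numpy as np

def hellinger(p, q):
    # Hellinger discrimination: h(P, Q) = (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))^2
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def triangular(p, q):
    # Triangular discrimination: Delta(P, Q) = sum_i (p_i - q_i)^2 / (p_i + q_i)
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum((p - q) ** 2 / (p + q))

def sym_chi2(p, q):
    # Symmetric chi-square divergence: Psi(P, Q) = chi^2(P||Q) + chi^2(Q||P)
    #                                            = sum_i (p_i - q_i)^2 (p_i + q_i) / (p_i q_i)
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum((p - q) ** 2 * (p + q) / (p * q))

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(hellinger(p, q), triangular(p, q), sym_chi2(p, q))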
Nested Inequalities Among Divergence Measures
In this paper we consider an inequality involving 11 divergence measures. Three of them are logarithmic: the Jeffreys-Kullback-Leibler [4][5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. Another three are non-logarithmic: the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more ...
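The snippet does not reproduce the full chain of 11 measures, but one representative link of this kind can be checked numerically: for the Hellinger discrimination h and the triangular discrimination Delta, the bounds Delta/4 <= h <= Delta/2 hold for all distribution pairs (this particular link is our illustration, not necessarily the exact chain of the paper). A minimal Python check:

import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    p = rng.dirichlet(np.ones(5))   # random probability vectors
    q = rng.dirichlet(np.ones(5))
    h = 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)   # Hellinger discrimination
    delta = np.sum((p - q) ** 2 / (p + q))             # triangular discrimination
    # Delta/4 <= h <= Delta/2, since (p+q) <= (sqrt(p)+sqrt(q))^2 <= 2(p+q) termwise.
    assert delta / 4 <= h + 1e-12 and h <= delta / 2 + 1e-12

print("Delta/4 <= h <= Delta/2 held on all random trials")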
On Mean Divergence Measures
The arithmetic, geometric and harmonic means are the three classical means well known in the literature. The square-root mean is also known. In this paper, we construct divergence measures based on nonnegative differences among these means, and establish an interesting inequality using properties of Csiszár's f-divergence. An improvement over this inequality is a...
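As a sketch of the construction this abstract describes, the classical means can be taken coordinate-wise over each pair (p_i, q_i) and their nonnegative differences summed; since H <= G <= A <= S pointwise, each summed difference is a nonnegative measure. The dictionary keys below are our own labels, not the paper's notation:

import numpy as np

def mean_difference_divergences(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    A = (p + q) / 2                     # arithmetic mean
    G = np.sqrt(p * q)                  # geometric mean
    H = 2 * p * q / (p + q)             # harmonic mean
    S = np.sqrt((p ** 2 + q ** 2) / 2)  # square-root (root-mean-square) mean
    # H <= G <= A <= S holds coordinate-wise, so every difference below is >= 0.
    return {"A-G": np.sum(A - G), "A-H": np.sum(A - H), "S-A": np.sum(S - A)}

print(mean_difference_divergences([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))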
Some Inequalities Among New Divergence Measures
Three classical divergence measures exist in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship among each other and are based on logarithmic expressions. The divergence m...
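One identity of this kind, presumably the relationship alluded to, is J(P||Q) = 4[I(P||Q) + T(P||Q)], where J is the J-divergence, I the Jensen-Shannon divergence and T the arithmetic-geometric mean divergence in Taneja's usual notation. It can be verified numerically with the following sketch (function and variable names are ours):

import numpy as np

def kl(p, q):
    # Kullback-Leibler relative information K(P||Q).
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(1)
for _ in range(1000):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    m = (p + q) / 2
    J = kl(p, q) + kl(q, p)                      # J-divergence
    I = 0.5 * (kl(p, m) + kl(q, m))              # Jensen-Shannon divergence
    T = np.sum(m * np.log(m / np.sqrt(p * q)))   # arithmetic-geometric mean divergence
    assert abs(J - 4 * (I + T)) < 1e-10

print("J = 4(I + T) verified on all random trials")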
Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler [13] relative information and the Jeffreys [12] J-divergence. The Sibson [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied new divergence measures based on arithmetic and geometric means...